φ-divergences and nested models



Similar articles

Loglinear Models: An approach based on φ-Divergences

In this paper we present a review of some results about inference based on φ-divergence measures, under assumptions of multinomial sampling and loglinear models. The minimum φ-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator, is considered. This estimator is used in a φ-divergence measure which is the basis of new statistics for solving three importan...
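The φ-divergence underlying such estimators is, for discrete distributions, D_φ(p ‖ q) = Σ_i q_i φ(p_i / q_i). A minimal sketch, not taken from the paper, with illustrative probability vectors and two standard choices of φ:

```python
import numpy as np

def phi_divergence(p, q, phi):
    """D_phi(p || q) = sum_i q_i * phi(p_i / q_i) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * phi(p / q)))

# phi(t) = t * log(t) recovers the Kullback-Leibler divergence.
kl = lambda t: t * np.log(t)
# phi(t) = (sqrt(t) - 1)^2 gives the squared Hellinger distance.
hellinger2 = lambda t: (np.sqrt(t) - 1.0) ** 2

# Illustrative multinomial probability vectors (not from the paper).
p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]

d_kl = phi_divergence(p, q, kl)       # equals sum_i p_i * log(p_i / q_i)
d_h = phi_divergence(p, q, hellinger2)
```

Minimizing such a D_φ over the model's parameter space yields the minimum φ-divergence estimator; with the KL choice of φ this coincides with maximum likelihood.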


Φ-Divergences, Sufficiency, Bayes Sufficiency, and Deficiency

The paper studies the relations between φ-divergences and fundamental concepts of decision theory such as sufficiency, Bayes sufficiency, and LeCam’s deficiency. A new and considerably simplified approach is given to the spectral representation of φ-divergences already established in Österreicher and Feldman [28] under restrictive conditions and in Liese and Vajda [22], [23] in the general form...


A Note on Integral Probability Metrics and φ-divergences

We study some connections between integral probability metrics [21] of the form γ_F(P, Q) := sup_{f ∈ F} |∫ f dP − ∫ f dQ| ...
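On a finite sample space the supremum defining γ_F can be evaluated in closed form for simple function classes. A small illustration (the vectors p and q are assumptions, not from the paper) with F the class of functions bounded by 1, for which γ_F reduces to the L1 distance between the probability vectors:

```python
import numpy as np

# Illustrative probability vectors on a 3-point space (not from the paper).
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

# For F = {f : sup|f| <= 1}, the supremum in
#   gamma_F(P, Q) = sup_{f in F} |E_P f - E_Q f|
# is attained at f = sign(p - q).
f_opt = np.sign(p - q)
gamma = abs(np.dot(f_opt, p) - np.dot(f_opt, q))

# This equals the L1 distance sum_i |p_i - q_i|,
# i.e. twice the total variation distance.
assert np.isclose(gamma, np.abs(p - q).sum())
```

Richer classes F (1-Lipschitz functions for the Wasserstein distance, bounded-Lipschitz functions for the Dudley metric) give the other IPMs discussed in these papers.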


On Integral Probability Metrics, φ-Divergences and Binary Classification

φ-divergences are a widely studied class of distance measures between probabilities. In this paper, a different class of distance measures on probabilities, called the integral probability metrics (IPMs), is considered. IPMs, for example the Wasserstein distance and the Dudley metric, have thus far been used only in a limited setting, as theoretical tools in mass transportation problems, in metriz...


deBruijn identities: from Shannon, Kullback–Leibler and Fisher to generalized φ-entropies, φ-divergences and φ-Fisher informations

In this paper we propose a generalization of the usual deBruijn identity that links the Shannon differential entropy (or the Kullback–Leibler divergence) and the Fisher information (or the Fisher divergence) of the output of a Gaussian channel. The generalization makes use of φ-entropies on the one hand, and of φ-divergences (of the Csiszár class) on the other hand, as generalizations of the S...
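The classical identity being generalized, d/dt h(X + √t Z) = (1/2) J(X + √t Z) for a Gaussian channel with Z ~ N(0, 1), can be checked numerically in the all-Gaussian case, where both the entropy and the Fisher information of the output are available in closed form. A sketch under that assumption (the variance and evaluation point are illustrative):

```python
import math

# Input X ~ N(0, sigma2); output X_t = X + sqrt(t) * Z ~ N(0, sigma2 + t).
sigma2 = 2.0

def h(t):
    # Differential entropy of N(0, sigma2 + t): (1/2) * ln(2*pi*e*(sigma2 + t)).
    return 0.5 * math.log(2 * math.pi * math.e * (sigma2 + t))

def J(t):
    # Fisher information (location parameter) of N(0, sigma2 + t).
    return 1.0 / (sigma2 + t)

# deBruijn identity: d/dt h(X_t) = (1/2) * J(X_t).
t, eps = 1.5, 1e-6
lhs = (h(t + eps) - h(t - eps)) / (2 * eps)  # central finite difference
rhs = 0.5 * J(t)
assert abs(lhs - rhs) < 1e-8
```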



Journal

Journal title: Applied Mathematics Letters

Year: 1997

ISSN: 0893-9659

DOI: 10.1016/s0893-9659(97)00095-5